Feature extraction based on Laplacian bidirectional maximum margin criterion
Authors
Abstract
Article history: Received 28 July 2008; Received in revised form 2 March 2009; Accepted 9 March 2009
Similar Articles
Feature extraction using two-dimensional local graph embedding based on maximum margin criterion
In this paper, we propose a novel method for image feature extraction, namely two-dimensional local graph embedding, which is based on the maximum margin criterion; it therefore needs no conversion of the image matrix into a high-dimensional image vector and directly avoids computing the inverse matrix in the discriminant criterion. This method learns the optimal projective vectors directly from 2D im...
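For context, the criterion underlying both this related method and the main article can be made concrete. Below is a minimal NumPy sketch of the standard vector-form maximum margin criterion (take the leading eigenvectors of Sb - Sw); the function name and parameters are illustrative rather than taken from either paper, and the 2D variant above operates on image matrices instead of flattened vectors.

import numpy as np

def mmc_projection(X, y, n_components):
    """Vector-form maximum margin criterion: maximize tr(W^T (Sb - Sw) W)
    by taking the top eigenvectors of Sb - Sw.
    X is (n_samples, n_features); y is a 1-D array of class labels."""
    y = np.asarray(y)
    mean_all = X.mean(axis=0)
    n_features = X.shape[1]
    Sb = np.zeros((n_features, n_features))
    Sw = np.zeros((n_features, n_features))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all)[:, None]
        Sb += Xc.shape[0] * diff @ diff.T          # between-class scatter
        Sw += (Xc - mean_c).T @ (Xc - mean_c)      # within-class scatter
    # Sb - Sw is symmetric, so eigh applies; no matrix inverse is needed,
    # which is why MMC sidesteps the singularity issue of Fisher's criterion.
    eigvals, eigvecs = np.linalg.eigh(Sb - Sw)
    order = np.argsort(eigvals)[::-1]              # largest eigenvalues first
    return eigvecs[:, order[:n_components]]        # projection matrix W

Usage would then be Z = X @ mmc_projection(X, y, n_components=10) to obtain the reduced features.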
A Recursive Information Gene Selection Using Improved Laplacian Maximum Margin Criterion
Gene selection is an important research topic in pattern recognition and tumor classification. Numerous methods have been proposed; the Maximum Margin Criterion (MMC) is one of the best-known approaches to the small sample size problem. However, MMC considers only the global structure of the samples. In this article, a novel recursive gene selection criterion named Laplacian Maximum ...
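One common way to give MMC the local sensitivity described here is to regularize the criterion matrix with a graph Laplacian built over the samples. The sketch below (NumPy/scikit-learn, reusing the Sb/Sw construction above) illustrates only that general idea; it is an assumption about the technique and does not reproduce the recursive gene-selection procedure of the cited article.

import numpy as np
from sklearn.neighbors import kneighbors_graph

def laplacian_mmc_matrix(X, Sb, Sw, n_neighbors=5, lam=0.1):
    """Augment the MMC criterion matrix Sb - Sw with a graph-Laplacian term
    that penalizes directions which tear apart neighbouring samples.
    This is a generic locality-regularized MMC, not the cited paper's method."""
    # Symmetrized k-nearest-neighbour adjacency graph over the samples.
    A = kneighbors_graph(X, n_neighbors=n_neighbors, mode='connectivity').toarray()
    A = np.maximum(A, A.T)
    L = np.diag(A.sum(axis=1)) - A                 # unnormalized graph Laplacian
    # X^T L X measures how strongly a direction separates nearby samples,
    # so subtracting it preserves the local manifold structure.
    return Sb - Sw - lam * (X.T @ L @ X)

Genes (features) could then be ranked, for example by the weights of the leading eigenvectors of this matrix, and eliminated recursively.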
Comments on "Efficient and Robust Feature Extraction by Maximum Margin Criterion"
The goal of this comment is first to point out two loopholes in the paper by Li et al. (2006): 1) the proposed efficient maximum margin criterion (MMC) algorithm for the small sample size (SSS) problem is problematic, and 2) the discussion of its equivalence with the null-space-based methods in the SSS problem does not hold. We then present a truly efficient MMC algorithm for the SSS problem.
Parallel Feature Extraction through Preserving Global and Discriminative Property for Kernel-Based Image Classification
Kernel-based feature extraction is widely used in image classification, and different kernel methods extract features according to different criteria. KPCA maximizes the determinant of the total scatter matrix of the transformed samples, while KDA seeks the directions of discrimination. KPCA preserves the global property, whereas KDA utilizes class information to enhance its discriminative ability, so a...
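To make the contrast concrete, here is a small NumPy sketch of kernel PCA with an RBF kernel, which captures the global scatter of the mapped samples; the gamma parameter and helper name are illustrative. KDA would instead use the class labels to pick discriminative directions in the same kernel-induced space.

import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Kernel PCA with an RBF kernel: directions that maximize the total
    scatter of the samples in the kernel-induced feature space."""
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * sq_dists)                            # RBF kernel matrix
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    K = K - one_n @ K - K @ one_n + one_n @ K @ one_n        # center in feature space
    eigvals, eigvecs = np.linalg.eigh(K)
    order = np.argsort(eigvals)[::-1][:n_components]
    # Projection of the training samples onto the leading kernel components.
    return eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 1e-12))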
Improving Chernoff criterion for classification by using the filled function
Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which the classes have unequal covariance matrices. To address this issue, the Chernoff distance is an appropriate criterion for measuring the distance between distributions. In the p...
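For reference, the Chernoff distance between two Gaussian class-conditional densities, the criterion discussed here, can be computed directly. The sketch below assumes Gaussian classes with given means and covariances and does not include the filled-function optimization the paper proposes; at alpha = 0.5 it reduces to the Bhattacharyya distance.

import numpy as np

def chernoff_distance(mu1, cov1, mu2, cov2, alpha=0.5):
    """Chernoff distance between N(mu1, cov1) and N(mu2, cov2).
    Unlike Fisher's criterion it accounts for unequal covariance matrices."""
    cov_mix = alpha * cov1 + (1 - alpha) * cov2
    diff = mu2 - mu1
    # Mahalanobis-like term for the mean separation under the mixed covariance.
    quad = alpha * (1 - alpha) / 2.0 * diff @ np.linalg.solve(cov_mix, diff)
    # Log-determinant term capturing the covariance mismatch.
    _, logdet_mix = np.linalg.slogdet(cov_mix)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    return quad + 0.5 * (logdet_mix - alpha * logdet1 - (1 - alpha) * logdet2)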
Journal: Pattern Recognition
Volume 42, Issue -
Pages -
Publication date: 2009